Tractable Bayesian density regression via logit stick-breaking priors

Authors

Abstract

There is growing interest in learning how the distribution of a response variable changes with a set of observed predictors. Bayesian nonparametric dependent mixture models provide a flexible approach to address this goal. However, several formulations require computationally demanding algorithms for posterior inference. Motivated by this issue, we study a class of predictor-dependent infinite mixture models which relies on a simple representation of the stick-breaking prior via sequential logistic regressions. This formulation maintains the same desirable properties of popular stick-breaking priors, and leverages a recent Pólya-gamma data augmentation to facilitate the implementation of several computational methods for posterior inference. These routines include Markov chain Monte Carlo via Gibbs sampling, expectation–maximization algorithms, and mean-field variational Bayes for scalable inference, thereby stimulating a wider implementation of Bayesian density regression among practitioners. The algorithms associated with these methods are presented in detail and tested in a toxicology study.
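As a rough illustration of the construction the abstract describes (not the authors' code), the predictor-dependent mixture weights can be obtained by passing each stick-breaking probability through a logistic regression in the predictors; the truncation level H and the coefficient matrix `alpha` below are hypothetical:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logit_stick_breaking_weights(x, alpha):
    """Mixture weights pi_h(x) built from sequential logistic regressions.

    x     : (p,) predictor vector (first entry an intercept)
    alpha : (H-1, p) logistic coefficients, one row per stick break
    Returns an (H,) vector of weights that sums to one.
    """
    nu = sigmoid(alpha @ x)                       # nu_h(x) = logit^{-1}(x' alpha_h)
    leftover = np.concatenate(([1.0], np.cumprod(1.0 - nu)))
    return np.append(nu, 1.0) * leftover          # last weight absorbs the remainder

rng = np.random.default_rng(0)
x = np.array([1.0, 0.5])                          # intercept plus one predictor
alpha = rng.normal(size=(4, 2))                   # hypothetical truncation H = 5
pi = logit_stick_breaking_weights(x, alpha)
```

Because the last weight absorbs whatever stick length remains, the truncated weights sum to one by construction for any predictor value.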


Related articles

Gibbs Sampling Methods for Stick-Breaking Priors

A rich and  exible class of random probability measures, which we call stick-breaking priors, can be constructed using a sequence of independent beta random variables. Examples of random measures that have this characterization include the Dirichlet process, its two-parameter extension, the two-parameter Poisson–Dirichlet process, Ž nite dimensional Dirichlet priors, and beta two-parameter pro...


Bayesian estimation of discrete entropy with mixtures of stick-breaking priors

We consider the problem of estimating Shannon’s entropy H in the under-sampled regime, where the number of possible symbols may be unknown or countably infinite. Dirichlet and Pitman–Yor processes provide tractable prior distributions over the space of countably infinite discrete distributions, and have found major applications in Bayesian non-parametric statistics and machine learning. Here we ...


Learning concept graphs from text with stick-breaking priors

We present a generative probabilistic model for learning general graph structures, which we term concept graphs, from text. Concept graphs provide a visual summary of the thematic content of a collection of documents—a task that is difficult to accomplish using only keyword search. The proposed model can learn different types of concept graph structures and is capable of utilizing partial prior...


Variational Inference for Stick-Breaking Beta Process Priors

We present a variational Bayesian inference algorithm for the stick-breaking construction of the beta process. We derive an alternate representation of the beta process that is amenable to variational inference, and present a bound relating the truncated beta process to its infinite counterpart. We assess performance on two matrix factorization problems, using a non-negative factorization model...


Single Factor Transformation Priors for Density Regression

Although discrete mixture modeling has formed the backbone of the literature on Bayesian density estimation incorporating covariates, the use of discrete mixtures leads to some well known disadvantages. Avoiding discrete mixtures, we propose a flexible class of priors based on random nonlinear functions of a uniform latent variable with an additive residual. These priors are related to Gaussian...



Journal

Journal title: Journal of Statistical Planning and Inference

Year: 2021

ISSN: 1873-1171, 0378-3758

DOI: https://doi.org/10.1016/j.jspi.2020.05.009